We provide here implementations of two new stochastic Newton-type algorithms for solving the finite-sum minimization problem in machine learning: the Stochastic Average Newton Method (SAN) and the Stochastic Average Newton Alternative (SANA). Currently our implementations support only generalized linear models (GLMs). As loss functions, the logistic loss is provided for binary classification problems, and the pseudo-Huber and L2 losses for regression problems. To compare our algorithms with state-of-the-art algorithms for solving GLMs, we also provide implementations of SAG and SVRG, as well as of Stochastic Newton as a benchmark for second-order incremental algorithms.
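For reference, the three supported losses are the standard ones used for GLMs, and each algorithm minimizes their average over the training examples. The sketch below only illustrates the formulas for a single example with linear prediction `t = <x, w>`; it is not the repository's implementation, and the `delta` parameter of the pseudo-Huber loss is a hypothetical name.

```python
import numpy as np

def logistic_loss(y, t):
    """Logistic loss for a binary label y in {-1, +1} and linear prediction t."""
    return np.log(1.0 + np.exp(-y * t))

def pseudo_huber_loss(y, t, delta=1.0):
    """Pseudo-Huber loss: a smooth approximation of the absolute error |t - y|."""
    return delta ** 2 * (np.sqrt(1.0 + ((t - y) / delta) ** 2) - 1.0)

def l2_loss(y, t):
    """Squared (L2) loss: 0.5 * (t - y)^2."""
    return 0.5 * (t - y) ** 2
```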
- numpy >= 1.13
- matplotlib >= 2.1
- scikit-learn >= 0.19
- Preparing the dataset. The datasets used in our experiments are downloaded from LibSVM; they should be placed in the folder `./datasets/` (a loading sketch is given after this list).
- Editing the `config.py` file. You need to edit some arguments to run the code according to your experimental settings; see the detailed explanations in this file (a hypothetical illustration follows this list).
- Editing the `run.sh` file to specify the datasets you want to run.
- Executing `chmod +x run.sh` in your terminal to make the script executable.
- Running `sh ./run.sh` in your terminal.
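For the dataset step, LibSVM-format files can be read directly with scikit-learn. The sketch below assumes a file named `a9a` has already been downloaded into `./datasets/`; the file name is only an example, and the repository's own data-loading code may differ.

```python
from sklearn.datasets import load_svmlight_file

# Read a LibSVM-format file from ./datasets/ (the file name "a9a" is only an example).
X, y = load_svmlight_file("./datasets/a9a")

# X is a scipy sparse feature matrix, y is a 1-D numpy array of labels or targets.
print(X.shape, y.shape)
```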
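For the `config.py` step, the exact arguments are documented inside the file itself. The snippet below is a purely hypothetical illustration of the kind of settings one typically edits there; every name in it is made up.

```python
# Hypothetical illustration only: the real argument names are explained in config.py itself.
dataset_path = "./datasets/a9a"   # LibSVM file to use
loss = "logistic"                 # "logistic", "pseudo-huber", or "l2"
reg_coef = 1e-4                   # regularization coefficient
n_epochs = 50                     # number of passes over the data
```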